-
WISER: Multimodal variational inference for full-waveform inversion without dimensionality reduction
We develop a semiamortized variational inference (VI) framework designed for computationally feasible uncertainty quantification in full-waveform inversion, exploring the multimodal posterior distribution without dimensionality reduction. The framework is called full-waveform VI via subsurface extensions with refinements (WISER). WISER builds on a supervised generative artificial intelligence method that performs approximate amortized inference at low cost, albeit with an amortization gap. This gap is closed through nonamortized refinements that make frugal use of wave physics. Case studies illustrate that WISER delivers full-resolution, computationally feasible, and reliable uncertainty estimates of velocity models and imaged reflectivities.
-
SUMMARY Geological Carbon Storage (GCS) is one of the most viable climate-change-mitigating, net-negative CO2-emission technologies for large-scale CO2 sequestration. However, subsurface complexities and reservoir heterogeneity demand a systematic approach to uncertainty quantification to ensure both containment and conformance, as well as to optimize operations. As a step toward a digital twin for monitoring and control of underground storage, we introduce a new machine-learning-based data-assimilation framework validated on realistic numerical simulations. The proposed digital shadow combines simulation-based inference (SBI) with a novel neural adaptation of a recently developed nonlinear ensemble filtering technique. To characterize the posterior distribution of CO2 plume states (saturation and pressure) conditioned on multimodal time-lapse data, consisting of imaged surface seismic and well-log data, a generic recursive scheme is employed in which neural networks are trained on simulated ensembles of the time-advanced state and observations. Once trained, the digital shadow infers the state as time-lapse field data become available. Unlike ensemble Kalman filtering, corrections to predicted states are computed via a learned nonlinear prior-to-posterior mapping that supports non-Gaussian statistics and nonlinear models for the dynamics and observations. Training and inference are facilitated by the combined use of conditional invertible neural networks and bespoke physics-based summary statistics. Starting with a probabilistic permeability model derived from a baseline seismic survey, the digital shadow is validated against unseen simulated ground-truth time-lapse data. Results show that injection-site-specific uncertainty in permeability can be incorporated into the state uncertainty, and that the highest reconstruction quality is achieved when conditioning on both seismic and wellbore data. Despite incomplete knowledge of the permeability, the digital shadow accurately tracks the subsurface state throughout a realistic CO2 injection project. This work establishes the first proof of concept for an uncertainty-aware, scalable digital shadow, laying the foundation for a digital twin to optimize underground storage operations.
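As a caricature of one recursion of such an ensemble scheme, the sketch below uses a scalar toy state, with ordinary least-squares regression standing in for the trained conditional invertible network; every function and constant here (the dynamics, the observation operator, the noise level) is an illustrative assumption, not the implementation described above:

```python
import numpy as np

rng = np.random.default_rng(1)

def dynamics(x):
    # Toy state transition (stand-in for a reservoir flow simulation).
    return 0.9 * x + 1.0

def observe(x):
    # Toy observation operator (stand-in for seismic imaging of the state).
    return x**2 / 10.0

def assimilate(ensemble, y_field, obs_noise=0.1):
    # One recursion: time-advance the ensemble, simulate observations,
    # learn an observation-to-state map, apply it to the field datum.
    x_pred = dynamics(ensemble)
    y_sim = observe(x_pred) + obs_noise * rng.normal(size=x_pred.shape)
    # Linear regression stands in for the learned nonlinear
    # prior-to-posterior mapping trained on (state, observation) pairs.
    A = np.vstack([y_sim, np.ones_like(y_sim)]).T
    coef, *_ = np.linalg.lstsq(A, x_pred, rcond=None)
    x_post_mean = coef[0] * y_field + coef[1]
    # Crude posterior ensemble: spread given by the regression residuals.
    resid = x_pred - A @ coef
    return x_post_mean + rng.permutation(resid)

ensemble = rng.normal(5.0, 1.0, size=200)   # prior state ensemble
x_true = 5.0                                 # unseen ground-truth state
for _ in range(3):                           # recursive filtering over time
    x_true = dynamics(x_true)
    y_field = observe(x_true)                # "field" observation
    ensemble = assimilate(ensemble, y_field)
```

After each recursion the ensemble concentrates around the true state; in the actual framework the regression step is replaced by a conditional invertible network acting on physics-based summary statistics.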
-
Abstract Because they quantify uncertainty, Bayesian solutions to inverse problems are the framework of choice in risk-averse applications. These benefits come at the cost of computations that are, in general, intractable. Advances in machine learning and variational inference (VI) have lowered this computational barrier by leveraging data-driven learning. Two VI paradigms have emerged that represent different tradeoffs: amortized and nonamortized. Amortized VI produces fast results, but because it generalizes over many observed datasets, its inference results are suboptimal. Nonamortized VI is slower at inference but finds better posterior approximations, since it is specialized to a single observed dataset. Current amortized VI techniques run into a suboptimality wall that cannot be overcome without more expressive neural networks or extra training data. We present a solution that enables iterative improvement of amortized posteriors using the same network architectures and training data. Our method requires extra computations, but these remain frugal because they are based on physics-hybrid methods and summary statistics. Importantly, these computations remain mostly offline, so our method maintains cheap and reusable online evaluation while bridging the optimality gap between the two paradigms. We denote our proposed method ASPIRE: Amortized posteriors with Summaries that are Physics-based and Iteratively REfined. We first validate our method on a stylized problem with a known posterior and then demonstrate its practical use on a high-dimensional, nonlinear transcranial medical-imaging problem with ultrasound. Compared with the baseline and previous methods in the literature, ASPIRE stands out as a computationally efficient and high-fidelity method for posterior inference.
-
We introduce a probabilistic technique for full-waveform inversion that uses variational inference and conditional normalizing flows to quantify uncertainty in migration-velocity models and its impact on imaging. Our approach integrates generative artificial intelligence with physics-informed common-image gathers, reducing reliance on accurate initial velocity models. Case studies demonstrate its efficacy in producing realizations of migration-velocity models conditioned on the data. These models are then used to quantify amplitude and positioning effects during subsequent imaging.
-
Normalizing flows are density estimation methods that provide efficient exact likelihood evaluation and sampling from high-dimensional distributions (Dinh et al., 2014). The method depends on the change-of-variables formula, which requires an invertible transform; normalizing flow architectures are therefore built to be invertible by design (Dinh et al., 2014). In theory, the invertibility of these architectures constrains their expressiveness, but the use of coupling layers allows normalizing flows to exploit the power of arbitrary neural networks, which do not themselves need to be invertible (Dinh et al., 2016). Layer invertibility also means that, if properly implemented, many layers can be stacked to increase expressiveness without creating a training memory bottleneck. The package we present, InvertibleNetworks.jl, is a pure Julia (Bezanson et al., 2017) implementation of normalizing flows. We have implemented many relevant neural network layers, including GLOW 1x1 invertible convolutions (Kingma & Dhariwal, 2018), affine/additive coupling layers (Dinh et al., 2014), Haar wavelet multiscale transforms (Haar, 1909), and hierarchical invertible neural transport (HINT) (Kruse et al., 2021), among others. These modular layers can be easily composed and modified to create different types of normalizing flows. As starting points, we have implemented RealNVP, GLOW, HINT, hyperbolic networks (Lensink et al., 2022), and their conditional counterparts so that users can quickly implement their individual applications.
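The coupling-layer construction can be sketched in a few lines (plain NumPy rather than InvertibleNetworks.jl; the function `t` is an arbitrary stand-in for a neural network and is an illustrative assumption):

```python
import numpy as np

def t(x):
    # Arbitrary nonlinear "network" acting on the first half of the input;
    # it does not need to be invertible for the layer to be invertible.
    return np.tanh(2.0 * x) + 0.5 * x**2

def additive_coupling_forward(x):
    # Split the input and shift the second half by a function of the first.
    x1, x2 = np.split(x, 2)
    return np.concatenate([x1, x2 + t(x1)])

def additive_coupling_inverse(y):
    # Exact inverse: subtract the same shift; t itself is never inverted.
    y1, y2 = np.split(y, 2)
    return np.concatenate([y1, y2 - t(y1)])

x = np.random.default_rng(0).normal(size=8)
y = additive_coupling_forward(x)
x_rec = additive_coupling_inverse(y)
assert np.allclose(x, x_rec)  # invertible by construction
```

Because the additive coupling only shifts half of the variables, its Jacobian is unit-triangular with determinant one, so the change-of-variables log-likelihood picks up no log-determinant term; affine couplings add a scaling whose log-determinant remains cheap to compute. Stacking such layers, with permutations or 1x1 convolutions mixing the halves between them, is what yields expressive flows.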
-
The industry is experiencing significant changes due to artificial intelligence (AI) and the challenges of the energy transition. While some view these changes as threats, recent advances in AI offer unique opportunities, especially in the context of “digital twins” for subsurface monitoring and control.
-
Modern-day reservoir management and monitoring of geologic carbon storage increasingly call for costly time-lapse seismic data collection. We demonstrate how techniques from graph theory can be used to optimize acquisition geometries for low-cost sparse 4D seismic data. Based on midpoint-offset-domain connectivity arguments, our algorithm automatically produces sparse nonreplicated time-lapse acquisition geometries that favor wavefield recovery.
-
Geologic carbon storage represents one of the few truly scalable technologies capable of reducing the CO2 concentration in the atmosphere. While this technology has the potential to scale, its success hinges on our ability to mitigate its risks. An important aspect of risk mitigation concerns assurances that the injected CO2 remains within the storage complex. Among the different monitoring modalities, seismic imaging stands out due to its ability to attain high-resolution and high-fidelity images. However, these superior features come at prohibitive costs and time-intensive efforts that potentially render extensive seismic monitoring undesirable. To overcome this shortcoming, we present a methodology in which time-lapse images are created by jointly inverting nonreplicated time-lapse monitoring data. By no longer insisting on replication of the surveys to obtain high-fidelity time-lapse images and differences, extreme costs and time-consuming labor are averted. To demonstrate our approach, hundreds of realistic synthetic noisy time-lapse seismic datasets are simulated that contain imprints of regular CO2 plumes and of irregular plumes that leak. These time-lapse datasets are subsequently inverted to produce time-lapse difference images, which are used to train a deep neural classifier. The testing results show that the classifier is capable of automatically detecting CO2 leakage on unseen data with reasonable accuracy. We consider this classifier a first step in the development of an automated workflow designed to handle the large number of continuously monitored CO2 injection sites needed to help combat climate change.
